Optimization Under Uncertainty Using the Generalized Inverse Distribution Function
A framework for robust optimization under uncertainty based on the
generalized inverse distribution function (GIDF), also called the quantile
function, is proposed here. Compared to more classical approaches that rely on
the use of statistical moments as deterministic attributes that define the
objectives of the optimization process, the inverse cumulative distribution
function allows for the use of all the possible information available in the
probabilistic domain. Furthermore, the use of a quantile-based approach leads
naturally to a multi-objective methodology which allows an a posteriori
selection of the candidate design based on risk/opportunity criteria defined by
the designer. Finally, the error on the estimation of the objectives due to the
resolution of the GIDF will be proven to be quantifiable.
Comment: 20 pages, 25 figures
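The quantile-function idea can be illustrated with a short sketch (hypothetical code, not taken from the paper): the GIDF Q(p) = inf{x : F(x) >= p} is estimated from samples and evaluated at several risk levels, so two uncertain designs can be compared a posteriori rather than through a single moment.

```python
import numpy as np

def empirical_quantile(samples, p):
    """Generalized inverse distribution function (quantile function):
    Q(p) = inf {x : F(x) >= p}, estimated from a finite sample."""
    xs = np.sort(np.asarray(samples, dtype=float))
    n = len(xs)
    # smallest index i such that (i + 1) / n >= p
    idx = int(np.ceil(p * n)) - 1
    return xs[max(idx, 0)]

# Two hypothetical designs with uncertain performance: compare them at
# several risk levels instead of through the mean alone.
rng = np.random.default_rng(0)
safe = rng.normal(1.0, 0.1, 10_000)    # low-variance design
risky = rng.normal(0.9, 0.5, 10_000)   # higher spread, better best case
for p in (0.05, 0.5, 0.95):
    print(p, empirical_quantile(safe, p), empirical_quantile(risky, p))
```

Evaluating the two designs at low quantiles favours the safe design and at high quantiles the risky one, which is the risk/opportunity trade-off the abstract refers to.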
Geometrical Insights for Implicit Generative Modeling
Learning algorithms for implicit generative models can optimize a variety of
criteria that measure how the data distribution differs from the implicit model
distribution, including the Wasserstein distance, the Energy distance, and the
Maximum Mean Discrepancy criterion. A careful look at the geometries induced by
these distances on the space of probability measures reveals interesting
differences. In particular, we can establish surprising approximate global
convergence guarantees for the 1-Wasserstein distance, even when the
parametric generator has a nonconvex parametrization.
Comment: this version fixes a typo in a definition
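Of the distances listed above, the energy distance has a particularly compact sample estimator. The sketch below (illustrative only, not the paper's code) computes it for one-dimensional samples from its definition 2 E|X - Y| - E|X - X'| - E|Y - Y'|.

```python
import numpy as np

def energy_distance(x, y):
    """Squared energy distance between two 1-D samples:
    2*E|X-Y| - E|X-X'| - E|Y-Y'|; zero when the samples coincide."""
    x = np.asarray(x, dtype=float)[:, None]
    y = np.asarray(y, dtype=float)[:, None]
    exy = np.abs(x - y.T).mean()   # mean pairwise |X - Y|
    exx = np.abs(x - x.T).mean()   # mean pairwise |X - X'|
    eyy = np.abs(y - y.T).mean()   # mean pairwise |Y - Y'|
    return 2.0 * exy - exx - eyy
```

Identical samples give zero, while separated samples give a positive value; this is one of the criteria an implicit generative model can be trained to minimize.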
Visualizing the Feature Importance for Black Box Models
In recent years, a large number of model-agnostic methods to improve the
transparency, trustability and interpretability of machine learning models have
been developed. We introduce local feature importance as a local version of a
recent model-agnostic global feature importance method. Based on local feature
importance, we propose two visual tools: partial importance (PI) and individual
conditional importance (ICI) plots which visualize how changes in a feature
affect the model performance on average, as well as for individual
observations. Our proposed methods are related to partial dependence (PD) and
individual conditional expectation (ICE) plots, but visualize the expected
(conditional) feature importance instead of the expected (conditional)
prediction. Furthermore, we show that averaging ICI curves across observations
yields a PI curve, and integrating the PI curve with respect to the
distribution of the considered feature results in the global feature
importance. Another contribution of our paper is the Shapley feature
importance, which fairly distributes the overall performance of a model among
the features according to the marginal contributions and which can be used to
compare the feature importance across different models.
Comment: To appear in Machine Learning and Knowledge Discovery in Databases:
European Conference, ECML PKDD 2018, Dublin, Ireland, September 10 to 14,
2018, Proceedings, Part
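The global method these local tools build on can be sketched as permutation feature importance (hypothetical code, not the authors' implementation): the importance of a feature is the increase in loss when its column is shuffled, breaking its association with the target.

```python
import numpy as np

def permutation_importance(model_fn, X, y, feature, loss_fn, rng):
    """Model-agnostic permutation feature importance: increase in loss
    after shuffling one feature column."""
    base = loss_fn(y, model_fn(X))
    Xp = X.copy()
    Xp[:, feature] = rng.permutation(Xp[:, feature])
    return loss_fn(y, model_fn(Xp)) - base

# Toy setup: the model depends only on feature 0, so shuffling feature 1
# leaves the loss unchanged while shuffling feature 0 degrades it.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = 3.0 * X[:, 0]
model = lambda X: 3.0 * X[:, 0]                       # stands in for a fitted model
mse = lambda yt, yp: float(np.mean((yt - yp) ** 2))
imp0 = permutation_importance(model, X, y, 0, mse, rng)
imp1 = permutation_importance(model, X, y, 1, mse, rng)
```

The paper's PI and ICI plots refine this by tracking how such importance varies with the feature's value, on average and per observation.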
Quantum encryption with certified deletion
Given a ciphertext, is it possible to prove the deletion of the underlying
plaintext? Since classical ciphertexts can be copied, clearly such a feat is
impossible using classical information alone. In stark contrast to this, we
show that quantum encodings enable certified deletion. More precisely, we show
that it is possible to encrypt classical data into a quantum ciphertext such
that the recipient of the ciphertext can produce a classical string which
proves to the originator that the recipient has relinquished any chance of
recovering the plaintext should the decryption key be revealed. Our scheme is
feasible with current quantum technology: the honest parties only require
quantum devices for single-qubit preparation and measurements; the scheme is
also robust against noise in these devices. Furthermore, we provide an analysis
that is suitable in the finite-key regime.
Comment: 28 pages, 1 figure. Some technical details modified
A vine copula mixed effect model for trivariate meta-analysis of diagnostic test accuracy studies accounting for disease prevalence
A bivariate copula mixed model has been recently proposed to synthesize diagnostic test accuracy studies, and it has been shown to be superior to the standard generalized linear mixed model in this context. Here, we employ trivariate vine copulas to extend the bivariate meta-analysis of diagnostic test accuracy studies by accounting for disease prevalence. Our vine copula mixed model includes the trivariate generalized linear mixed model as a special case and can also operate on the original scale of sensitivity, specificity, and disease prevalence. Our general methodology is illustrated by re-analyzing the data of two published meta-analyses. Our study suggests that there can be an improvement over the trivariate generalized linear mixed model in fit to data, and makes the argument for moving to vine copula random effects models, especially because of their richness, including reflection-asymmetric tail dependence, and their computational feasibility despite being three-dimensional.
Tight Finite-Key Analysis for Quantum Cryptography
Despite enormous progress both in theoretical and experimental quantum
cryptography, the security of most current implementations of quantum key
distribution is still not established rigorously. One of the main problems is
that the security of the final key is highly dependent on the number, M, of
signals exchanged between the legitimate parties. While, in any practical
implementation, M is limited by the available resources, existing security
proofs are often only valid asymptotically for unrealistically large values of
M. Here, we demonstrate that this gap between theory and practice can be
overcome using a recently developed proof technique based on the uncertainty
relation for smooth entropies. Specifically, we consider a family of
Bennett-Brassard 1984 quantum key distribution protocols and show that security
against general attacks can be guaranteed already for moderate values of M.
Comment: 11 pages, 2 figures
Experimental measurement-device-independent quantum digital signatures
The development of quantum networks will be paramount for practical and secure telecommunications. These networks will need to sign and distribute information between many parties with information-theoretic security, requiring both quantum digital signatures (QDS) and quantum key distribution (QKD). Here, we introduce and experimentally realise a quantum network architecture, where the nodes are fully connected using a minimum amount of physical links. The central node of the network can act either as a totally untrusted relay, connecting the end users via the recently introduced measurement-device-independent (MDI)-QKD, or as a trusted recipient directly communicating with the end users via QKD. Using this network, we perform a proof-of-principle demonstration of QDS mediated by MDI-QKD. For that, we devised an efficient protocol to distil multiple signatures from the same block of data, thus reducing the statistical fluctuations in the sample and greatly enhancing the final QDS rate in the finite-size scenario.
Characterization of stochastic orders by L-functionals
Random variables may be compared with respect to their location by comparing certain ad hoc functionals, such as the mean or median, or by means of stochastic orderings based directly on the properties of the corresponding distribution functions. These alternative approaches are brought together in this paper. We focus on the class of L-functionals discussed by Bickel and Lehmann (1975) and characterize the comparison of random variables in terms of these measures by means of several stochastic orders based on iterated integrals, including the increasing convex order.
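An L-functional has the form of an integral of J(u) Q(u) over u in (0, 1), with J a weight function and Q the quantile function. A minimal numerical sketch (hypothetical names; note that np.quantile interpolates linearly rather than using the left-continuous inverse, which is adequate for illustration):

```python
import numpy as np

def l_functional(samples, weight_fn, grid=1000):
    """Estimate an L-functional integral of J(u) * Q(u) over (0, 1),
    where Q is the (interpolated) empirical quantile function."""
    u = (np.arange(grid) + 0.5) / grid           # midpoint grid on (0, 1)
    q = np.quantile(np.asarray(samples, dtype=float), u)
    return float(np.mean(weight_fn(u) * q))

# J = 1 recovers the mean; a window restricted to (0.25, 0.75) gives a
# trimmed mean, another member of the L-functional class.
mean_like = l_functional([1.0, 2.0, 3.0, 4.0], lambda u: np.ones_like(u))
trimmed = l_functional([1.0, 2.0, 3.0, 4.0],
                       lambda u: 2.0 * ((u > 0.25) & (u < 0.75)))
```

Choosing different nonnegative weights J yields the different location measures whose orderings the paper characterizes.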
Liquid-gas phase transition in nuclear multifragmentation
The equation of state of nuclear matter suggests that at suitable beam
energies the disassembling hot system formed in heavy ion collisions will pass
through a liquid-gas coexistence region. Searching for the signatures of the
phase transition has been a very important focal point of experimental
endeavours in heavy ion collisions, in the last fifteen years. Simultaneously
theoretical models have been developed to provide information about the
equation of state and reaction mechanisms consistent with the experimental
observables. This article is a review of this endeavour.
Comment: 63 pages, 27 figures, submitted to Adv. Nucl. Phys. Some typos
corrected, minor text changes
On tail trend detection: modeling relative risk
The climate change dispute is about changes over time of environmental
characteristics (such as rainfall). Some people say that a possible change is
not so much in the mean but rather in the extreme phenomena (that is, the
average rainfall may not change much but heavy storms may become more or less
frequent). The paper studies changes over time in the probability that some
high threshold is exceeded. The model is such that the threshold does not need
to be specified, the results hold for any high threshold. For simplicity a
certain linear trend is studied depending on one real parameter. Estimation and
testing procedures (is there a trend?) are developed. Simulation results are
presented. The method is applied to trends in heavy rainfall at 18 gauging
stations across Germany and The Netherlands. A tentative conclusion is that the
trend seems to depend on whether or not a station is close to the sea.
Comment: 38 pages
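The idea of tracking how often a high threshold is exceeded over time can be mimicked crudely (a hypothetical sketch, not the paper's estimator) by comparing exceedance frequencies in two halves of a record; their ratio is a rough relative risk.

```python
import numpy as np

def exceedance_relative_risk(series, threshold):
    """Relative risk of exceeding a high threshold: the exceedance
    frequency in the second half of the record divided by that in
    the first half."""
    s = np.asarray(series, dtype=float)
    half = len(s) // 2
    early = np.mean(s[:half] > threshold)
    late = np.mean(s[half:] > threshold)
    return float(late / early)

# Synthetic "rainfall" whose scale drifts upward over time: heavy events
# become more frequent even though the bulk changes only modestly.
rng = np.random.default_rng(2)
n = 4000
scale = 1.0 + np.linspace(0.0, 1.0, n)
rain = rng.exponential(scale)
rr = exceedance_relative_risk(rain, threshold=4.0)
```

The paper's parametric approach is more refined in that its trend estimate holds simultaneously for any sufficiently high threshold, rather than for one fixed threshold as here.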